General Functional Matrix Factorization Using Gradient Boosting
Authors
Abstract
Matrix factorization is among the most successful techniques for collaborative filtering. One challenge in collaborative filtering is how to exploit available auxiliary information to improve prediction accuracy. In this paper, we study the problem of utilizing auxiliary information as features of factorization and propose formalizing it as general functional matrix factorization, a model that includes conventional matrix factorization models as special cases. Moreover, we propose a gradient-boosting-based algorithm to solve the optimization problem efficiently. Finally, we give two specific algorithms for efficient feature-function construction for two concrete tasks. Our method can construct more suitable feature functions by searching an infinite functional space guided by the training data, and thus yields better prediction accuracy. Experimental results demonstrate that the proposed method outperforms the baseline methods on three real-world datasets.
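For context, here is a minimal sketch of the conventional model the paper generalizes (this is plain matrix factorization, not the paper's functional algorithm; the function name `factorize` and the toy ratings matrix are illustrative). Ordinary MF learns static latent factor matrices U and V by minimizing squared error on observed ratings; the paper's functional variant would instead learn the factors as boosted functions of auxiliary features.

```python
import numpy as np

def factorize(R, mask, k=2, lr=0.01, reg=0.1, epochs=500, seed=0):
    """Plain matrix factorization by full-batch gradient descent.

    R    : ratings matrix (zeros where unobserved)
    mask : 1.0 on observed entries, 0.0 elsewhere
    """
    rng = np.random.default_rng(seed)
    n, m = R.shape
    U = 0.1 * rng.standard_normal((n, k))   # user latent factors
    V = 0.1 * rng.standard_normal((m, k))   # item latent factors
    for _ in range(epochs):
        E = mask * (R - U @ V.T)            # residuals on observed entries only
        U += lr * (E @ V - reg * U)         # gradient step with L2 penalty
        V += lr * (E.T @ U - reg * V)
    return U, V

# toy 3-users x 3-items ratings; 0 marks a missing entry
R = np.array([[5., 4., 0.],
              [4., 0., 1.],
              [1., 1., 5.]])
mask = (R > 0).astype(float)
U, V = factorize(R, mask)
```

In the paper's setting, the rows of U and V would be replaced by feature functions of user/item auxiliary information, so entities sharing features share statistical strength.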
Similar References
Local Topic Discovery via Boosted Ensemble of Nonnegative Matrix Factorization
Nonnegative matrix factorization (NMF) has become increasingly popular for topic modeling of large-scale document collections. However, the resulting topics often represent only general, and thus redundant, information about the data rather than minor but potentially meaningful information to users. To tackle this problem, we propose a novel ensemble model of nonnegative matrix factorization for discovering high...
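For reference, a minimal sketch of standard NMF with Lee–Seung multiplicative updates (an assumption on our part; the boosted ensemble that paper proposes on top of NMF is not shown here):

```python
import numpy as np

def nmf(X, k=3, iters=200, seed=0, eps=1e-9):
    """Standard NMF: X ≈ W @ H with W, H >= 0, via multiplicative updates."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + eps
    H = rng.random((k, m)) + eps
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # update keeps H nonnegative
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # update keeps W nonnegative
    return W, H

# synthetic nonnegative "term-document" matrix of exact rank 3
rng = np.random.default_rng(1)
X = rng.random((20, 3)) @ rng.random((3, 30))
W, H = nmf(X, k=3)
```

In topic modeling, columns of X are documents over a term vocabulary, and each column of W can be read as a topic.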
Boosting Methods: Why they can be useful for High-Dimensional Data
We present an extended abstract about boosting. Section 1 describes, in a self-contained way, a generic functional gradient descent algorithm, which yields a general representation of boosting. Properties of boosting, or functional gradient descent, are then briefly summarized in Section 2.
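The generic algorithm described there can be sketched for squared loss, where the negative functional gradient at the current fit is simply the residual vector; the stump base learner and fixed shrinkage below are our illustrative choices, not that paper's exact formulation:

```python
import numpy as np

def fit_stump(x, r):
    """Best single-split regression stump for residuals r on 1-D input x."""
    best = None
    for t in np.unique(x)[:-1]:             # last value would leave an empty side
        left, right = r[x <= t], r[x > t]
        pred = np.where(x <= t, left.mean(), right.mean())
        sse = ((r - pred) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    return best[1:]                         # (threshold, left value, right value)

def boost(x, y, rounds=50, nu=0.3):
    """L2 boosting: repeatedly fit a stump to the negative gradient (residuals)."""
    f = np.zeros_like(y)
    for _ in range(rounds):
        t, a, b = fit_stump(x, y - f)       # residual = -dL/df for squared loss
        f += nu * np.where(x <= t, a, b)    # damped step in function space
    return f

x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x)
f = boost(x, y)
```

Each round is one functional gradient descent step; the shrinkage factor nu acts as the learning rate.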
Preconditioning with Matrix Factorization
Pietsch Factorization and Grothendieck Factorization are the two landmark theorems in modern functional analysis. They were first introduced to the numerical linear algebra community by the work of Joel A. Tropp in the column subset selection problem, which seeks to extract from a matrix a column submatrix that has lower spectral norm. Despite their broad application in functional analysis, the...
Optimization by gradient boosting
Gradient boosting is a state-of-the-art prediction technique that sequentially produces a model in the form of linear combinations of simple predictors—typically decision trees—by solving an infinite-dimensional convex optimization problem. We provide in the present paper a thorough analysis of two widespread versions of gradient boosting, and introduce a general framework for studying these al...
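The "linear combination of simple predictors" view can be made concrete with a greedy sketch over a finite dictionary of step functions (the dictionary and update rule are our assumptions for illustration, not the two versions analyzed in that paper):

```python
import numpy as np

def boost_dictionary(x, y, rounds=200, nu=0.2):
    """Greedily build f as a sparse linear combination of step predictors."""
    thresholds = np.unique(x)[:-1]
    H = (x[None, :] > thresholds[:, None]).astype(float)  # h_t(x) = 1{x > t}
    norms = (H ** 2).sum(axis=1)
    f = np.zeros_like(y)
    coefs = np.zeros(len(thresholds))
    for _ in range(rounds):
        g = y - f                            # negative gradient of squared loss
        scores = H @ g
        j = np.argmax(scores ** 2 / norms)   # predictor that best fits g
        step = scores[j] / norms[j]          # least-squares coefficient
        f += nu * step * H[j]                # shrunken update, as in boosting
        coefs[j] += nu * step
    return f, coefs

x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x)
f, coefs = boost_dictionary(x, y)
```

The final model is the linear combination sum_j coefs[j] * h_j, built by stagewise descent in function space rather than by solving for all coefficients jointly.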
Contextual Collaborative Filtering via Hierarchical Matrix Factorization
Matrix factorization (MF) has been demonstrated to be one of the most competitive techniques for collaborative filtering. However, state-of-the-art MF methods do not consider contextual information, where ratings can be generated under different environments. For example, users select items under various situations, such as happy mood vs. sad, mobile vs. stationary, movies vs. books, etc. Under differe...